Learning Rates for Classification with Gaussian Kernels

Authors

  • Shaobo Lin
  • Jinshan Zeng
  • Xiangyu Chang
Abstract

This letter aims at a refined error analysis for binary classification using the support vector machine (SVM) with a Gaussian kernel and a convex loss. Our first result shows that for some loss functions, such as the truncated quadratic loss and the quadratic loss, SVM with the Gaussian kernel can reach an almost optimal learning rate, provided the regression function is smooth. Our second result shows that for a large class of loss functions, under a Tsybakov noise assumption, if the regression function is infinitely smooth, then SVM with the Gaussian kernel can achieve a learning rate of order [Formula: see text], where [Formula: see text] is the number of samples.
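The analysis in the letter is theoretical, but the setting can be illustrated concretely. Below is a minimal sketch, assuming scikit-learn is available, of binary classification with a Gaussian (RBF) kernel SVM on synthetic data; the kernel width gamma, the regularization constant C, and the synthetic regression function are illustrative assumptions, and scikit-learn's SVC minimizes the hinge loss rather than the truncated quadratic loss studied here.

```python
# Minimal sketch (assumption: scikit-learn is installed).
# Illustrates the setting of the letter: binary classification with an SVM
# using a Gaussian (RBF) kernel. Note that sklearn's SVC uses the hinge loss,
# not the truncated quadratic loss analyzed in the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic binary data whose labels depend on a smooth regression function.
m = 2000                                    # number of samples
X = rng.uniform(-1.0, 1.0, size=(m, 2))
p = 1.0 / (1.0 + np.exp(-5.0 * np.sin(np.pi * X[:, 0]) * X[:, 1]))
y = (rng.uniform(size=m) < p).astype(int)   # noisy labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gaussian-kernel SVM; gamma plays the role of the inverse kernel width.
clf = SVC(kernel="rbf", gamma=2.0, C=1.0)
clf.fit(X_train, y_train)

print("test misclassification error:", 1.0 - clf.score(X_test, y_test))
```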


Similar Resources

Classification with Gaussians and Convex Loss

This paper considers binary classification algorithms generated from Tikhonov regularization schemes associated with general convex loss functions and varying Gaussian kernels. Our main goal is to provide fast convergence rates for the excess misclassification error. Allowing varying Gaussian kernels in the algorithms improves learning rates measured by regularization error and sample error. Sp...
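For reference, the generic Tikhonov regularization scheme referred to in this abstract is usually written as below; this is the standard formulation for classification with a convex loss in the Gaussian reproducing kernel Hilbert space, and the notation (loss φ, kernel width σ, regularization parameter λ) is an assumption rather than taken from the truncated text.

```latex
% Tikhonov regularization in the RKHS \mathcal{H}_{K_\sigma} of the Gaussian
% kernel K_\sigma(x, x') = \exp(-|x - x'|^2 / \sigma^2), with convex loss \phi
% and samples \{(x_i, y_i)\}_{i=1}^m.
\[
  f_{z,\lambda}
  \;=\; \arg\min_{f \in \mathcal{H}_{K_\sigma}}
  \Biggl\{ \frac{1}{m} \sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
           \;+\; \lambda \, \|f\|_{K_\sigma}^{2} \Biggr\},
  \qquad
  \operatorname{sgn}(f_{z,\lambda}) \text{ is the induced classifier.}
\]
```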


Learnability of Gaussians with Flexible Variances

Gaussian kernels with flexible variances provide a rich family of Mercer kernels for learning algorithms. We show that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is a uniform Glivenko-Cantelli (uGC) class. This result confirms a conjecture concerning learnability of Gaussian kernels and verifies the uniform convergence ...


Recognizing the Emotional State Changes in Human Utterance by a Learning Statistical Method based on Gaussian Mixture Model

Speech is one of the richest and most immediate means of expressing the emotional characteristics of human beings, and it conveys cognitive and semantic concepts among humans. In this study, a statistical method for emotion recognition from speech signals is proposed, and a learning approach based on a statistical model is introduced to classify the internal feelings of the utterance....


Optimal regression rates for SVMs using Gaussian kernels

Support vector machines (SVMs) using Gaussian kernels are among the standard, state-of-the-art learning algorithms. In this work, we establish new oracle inequalities for such SVMs when applied to either least squares or conditional quantile regression. With the help of these oracle inequalities, we then derive learning rates that are (essentially) minimax optimal under standard smoothness as...
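For the least squares case mentioned here, the corresponding estimator is regularized least squares in the Gaussian RKHS. The following is a minimal sketch under the assumption that scikit-learn is available; KernelRidge with an RBF kernel is used as a stand-in for the least squares SVM, and the parameter values and synthetic target are illustrative only.

```python
# Minimal sketch (assumption: scikit-learn). Gaussian-kernel regularized
# least squares regression, i.e. the "least squares" setting referred to above.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

m = 500
X = rng.uniform(-1.0, 1.0, size=(m, 1))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(m)  # smooth target + noise

# alpha is the regularization parameter, gamma the inverse Gaussian kernel width.
reg = KernelRidge(kernel="rbf", alpha=1e-3, gamma=10.0)
reg.fit(X, y)

X_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
mse = np.mean((reg.predict(X_test) - np.sin(2.0 * np.pi * X_test[:, 0])) ** 2)
print("test MSE against the noise-free target:", mse)
```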


SVM Learning and Lp Approximation by Gaussians on Riemannian Manifolds

Using multi-Gaussian support vector machine (SVM) classification, we confirm that the intrinsic dimension of Riemannian manifolds improves the efficiency (learning rates) of learning algorithms. The essential analysis lies in the study of approximation in Lp (1 ≤ p < ∞) of Lp functions by their convolutions with the Gaussian kernel with variance σ → 0. This covers the SVM case when the approxim...



Journal:
  • Neural Computation

Volume 29, Issue 12

Year of publication: 2017